
    Optimizing the k-NN metric weights using differential evolution

    The traditional k-NN classifier has several limitations, including that it does not take into account the distribution of each class, the importance of each feature, the contribution of each neighbor, or the number of instances in each class. A differential evolution (DE) optimization technique is utilized to enhance the performance of k-NN by optimizing the metric weights of features, neighbors and classes. Several datasets are used to evaluate the performance of the proposed DE-based metrics and to compare them to some k-NN variants from the literature. Practical experiments indicate that in most cases, incorporating DE in k-NN classification provides more accurate performance. ©2010 IEEE
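
    As a rough illustration of the weighted metric idea (not the paper's exact formulation; the function and variable names here are our own), a k-NN vote in which feature weights scale the distance and class weights rescale the neighbor votes might look like:

```python
import math

def weighted_knn_predict(train, labels, x, k, feat_w, class_w):
    """Hypothetical sketch: feature weights scale each dimension of the
    distance; class weights rescale the distance-weighted neighbor votes."""
    dists = sorted(
        (math.sqrt(sum(w * (a - b) ** 2 for w, a, b in zip(feat_w, xi, x))), yi)
        for xi, yi in zip(train, labels)
    )
    votes = {}
    for d, yi in dists[:k]:
        # Closer neighbors contribute more; the class weight can correct
        # for class distribution / per-class instance counts.
        votes[yi] = votes.get(yi, 0.0) + class_w.get(yi, 1.0) / (d + 1e-9)
    return max(votes, key=votes.get)
```

    In the paper's scheme, DE would then search over these weight vectors (features, neighbors and classes) to maximize classification accuracy on held-out data.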

    Enhancing the diversity of genetic algorithm for improved feature selection

    Genetic algorithm (GA) is one of the most widely used population-based evolutionary search algorithms. One of the challenging optimization problems to which GA has been extensively applied is feature selection. It aims at finding an optimal, small-sized subset of features from the original large feature set. It has been found that the main limitation of traditional GA-based feature selection is that it tends to get trapped in local minima, a problem known as premature convergence. A number of implementations are presented in the literature to overcome this problem, based on fitness scaling, genetic operator modification, boosting genetic population diversity, etc. This paper presents a new modified genetic algorithm based on enhanced population diversity, parent selection and improved genetic operators. Practical results indicate the significance of the proposed GA variant in comparison to many other algorithms from the literature on different datasets. ©2010 IEEE
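
    A minimal sketch of GA-based feature selection with one common diversity-maintenance trick (replacing exact duplicates with random immigrants). This is an illustration only, not the paper's specific operators; all names are hypothetical:

```python
import random

def ga_feature_select(n_feats, fitness, pop_size=20, gens=40, p_mut=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        nxt = [scored[0][:], scored[1][:]]        # elitism
        while len(nxt) < pop_size:
            p1, p2 = rng.sample(scored[:10], 2)   # parents from the top half
            cut = rng.randrange(1, n_feats)       # one-point crossover
            child = [b ^ (rng.random() < p_mut) for b in p1[:cut] + p2[cut:]]
            nxt.append(child)
        # Diversity step: replace exact duplicates with random immigrants,
        # which counters premature convergence.
        seen, pop = set(), []
        for c in nxt:
            if tuple(c) in seen:
                c = [rng.randint(0, 1) for _ in range(n_feats)]
            seen.add(tuple(c))
            pop.append(c)
    return max(pop, key=fitness)
```

    A chromosome is a bit mask over the feature set; the fitness function would normally wrap a classifier's validation accuracy, penalized by subset size.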

    A modified k-nearest neighbor classifier to deal with unbalanced classes

    We present in this paper a simple, yet valuable, improvement to the traditional k-Nearest Neighbor (kNN) classifier. It aims at addressing the issue of unbalanced classes by maximizing the class-wise classification accuracy. The proposed classifier also gives the option of favoring a particular class through evaluating a small set of fuzzy rules. When tested on a number of UCI datasets, the proposed algorithm achieved uniformly good performance.

    Intelligent artificial ants based feature extraction from wavelet packet coefficients for biomedical signal classification

    In this paper, a new feature extraction method utilizing ant colony optimization in the selection of the wavelet packet transform (WPT) best basis is presented and adopted in classifying biomedical signals. The new algorithm, termed Intelligent Artificial Ants (IAA), searches the wavelet packet tree for subsets of features that best interact together to produce high classification accuracies. While traversing the WPT tree, the IAA takes into account the existing correlation between features, thus avoiding information redundancy. The IAA method is a mixture of the filter and wrapper approaches to feature subset selection. The pheromone that the ants lay down is updated by means of an estimate of the information content of a single feature or feature subset. The significance of the subsets selected by the ants is measured using a linear discriminant analysis (LDA) classifier. The IAA method is tested on one of the most important biosignal-driven applications, the Brain Computer Interface (BCI) problem, with 56 EEG channels. Practical results indicate the significance of the proposed method, which achieves a maximum accuracy of 83%. ©2008 IEEE

    Enhanced feature selection algorithm using Ant Colony Optimization and fuzzy memberships

    Feature selection is an indispensable pre-processing step when mining huge datasets, and it can significantly improve the overall system performance. This paper presents a novel feature selection method that utilizes both Ant Colony Optimization (ACO) and fuzzy memberships. The algorithm estimates the local importance of subsets of features, i.e., their pheromone intensities, by utilizing the fuzzy c-means (FCM) clustering technique. In order to prove the effectiveness of the proposed method, a comparison with another powerful ACO-based feature selection algorithm that utilizes the Mutual Information (MI) concept is presented. The method is tested on two biosignal-driven applications: Brain Computer Interface (BCI), and prosthetic device control with myoelectric signals (MES). A linear discriminant analysis (LDA) classifier is used to measure the performance of the selected subsets in both applications. Practical experiments prove that the new algorithm can be as accurate as the original method with MI, but with a significant reduction in computational cost, especially when dealing with huge datasets.

    A novel swarm based feature selection algorithm in multifunction myoelectric control

    Accurate and computationally efficient myoelectric control strategies have been the focus of a great deal of research in recent years. Although many attempts to develop such strategies exist in the literature, deficiencies remain. One of the major challenges in myoelectric control is finding an optimal feature set that can best discriminate between classes. However, since the myoelectric signal is recorded using multiple channels, the feature vector can become very large. Hence, a dimensionality reduction method is needed to identify an informative, yet small, feature set. This paper presents a new feature selection method based on modifying the Particle Swarm Optimization (PSO) algorithm with the inclusion of a Mutual Information (MI) measure. The new method, called BPSOMI, is a mixture of the filter and wrapper approaches to feature selection. In order to prove its efficiency, the proposed method is tested against other dimensionality reduction techniques, achieving powerful classification accuracy. © 2009 - IOS Press and the authors. All rights reserved
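
    BPSOMI folds a mutual-information filter into the swarm update; omitting the MI component, the underlying binary PSO that evolves a feature bit mask can be sketched as follows (an illustrative sketch, with hypothetical names, not the paper's exact algorithm):

```python
import math
import random

def binary_pso(n_feats, fitness, n_particles=15, iters=50,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    pos = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(n_particles)]
    vel = [[0.0] * n_feats for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pbest, key=fitness)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(n_feats):
                vel[i][j] = (w * vel[i][j]
                             + c1 * rng.random() * (pbest[i][j] - pos[i][j])
                             + c2 * rng.random() * (gbest[j] - pos[i][j]))
                # The sigmoid maps the real-valued velocity to the
                # probability that this feature bit is switched on.
                pos[i][j] = 1 if rng.random() < sig(vel[i][j]) else 0
            if fitness(pos[i]) > fitness(pbest[i]):
                pbest[i] = pos[i][:]
                if fitness(pbest[i]) > fitness(gbest):
                    gbest = pbest[i][:]
    return gbest
```

    In the wrapper setting described in the abstract, the fitness function would be an LDA classifier's accuracy on the candidate feature subset.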

    An enhanced neural network ensemble for automatic sleep scoring

    Improving the diversity of Neural Network Ensembles (NNE) plays an important role in creating robust classification systems in many fields. Several methods have been proposed in the literature to create such diversity using different sets of classifiers or using different portions of training/feature sets. Neural networks are often used as base classifiers in multiple classifier systems as they adapt easily to small changes in the training data, therefore creating diversity that is necessary to make the ensemble work. This paper presents a novel algorithm based on generating a set of classifiers such that each one of them is biased towards one of the target classes. This will improve the ensemble diversity and hence its performance. Results on sleep data sets show that the proposed method is able to outperform the traditional fusion algorithms of bagging and boosting. © 2011 IEEE
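
    The class-biased ensemble idea can be illustrated with a toy base learner (the paper uses neural networks; here a nearest-centroid classifier stands in, and all names are our own): each member shrinks the distance to its target class, and the ensemble fuses the members by majority vote.

```python
class BiasedCentroid:
    """Nearest-centroid base learner whose decision is biased towards one
    target class by shrinking that class's distance (bias < 1.0)."""
    def __init__(self, target, bias=0.8):
        self.target, self.bias = target, bias
    def fit(self, X, y):
        self.centroids = {}
        for c in set(y):
            pts = [x for x, yi in zip(X, y) if yi == c]
            self.centroids[c] = [sum(col) / len(pts) for col in zip(*pts)]
        return self
    def predict(self, x):
        def d(c):
            cen = self.centroids[c]
            dist = sum((a - b) ** 2 for a, b in zip(x, cen)) ** 0.5
            return dist * (self.bias if c == self.target else 1.0)
        return min(self.centroids, key=d)

def ensemble_predict(members, x):
    # Majority-vote fusion over the class-biased members.
    votes = {}
    for m in members:
        p = m.predict(x)
        votes[p] = votes.get(p, 0) + 1
    return max(votes, key=votes.get)
```

    Because each member errs in a different direction (towards its own class), the members disagree on borderline cases, which is precisely the diversity the ensemble exploits.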

    Evaluation of feature selection methods for improved EEG classification

    This paper compares several methods for feature selection used in EEG classification. Sequential, heuristic and population-based search methods are compared according to their efficiency and computational cost. A support vector machine classifier has been used to compare accuracies. The effect of the size of the feature space has been explored by varying the total number of variables between 27 and 168. Experiments have been conducted to select channels as well as to select individual features from different channels. © 2006 Research Publishing Services

    Feature subset selection using differential evolution and a wheel based search strategy

    Differential evolution has started to attract a lot of attention as a powerful search method and has been successfully applied to a variety of applications including pattern recognition. One of the most important tasks in many pattern recognition systems is to find an informative subset of features that can effectively represent the underlying problem. Specifically, a large number of features can affect the system's classification accuracy and learning time. In order to overcome such problems, we propose a new feature selection method that utilizes differential evolution in a novel manner to identify relevant feature subsets. The proposed method aims to reduce the search space using a simple, yet powerful, procedure that involves distributing the features among a set of wheels. Two versions of the method are presented. In the first one, the desired feature subset size is predefined by the user, while in the second the user only needs to set an upper limit to the feature subset size. Experiments on a number of datasets with different sizes proved that the proposed method can achieve remarkably good results when compared with some of the well-known feature selection methods. © 2012 Elsevier B.V. All rights reserved
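
    A minimal sketch of the wheel idea as we read it (the helper names are hypothetical): the features are spread over as many wheels as the desired subset size, and each real-valued DE gene selects one feature from its wheel, so standard continuous DE operators can evolve a fixed-size subset over a much smaller search space.

```python
import random

def make_wheels(n_feats, subset_size, seed=0):
    # Spread the feature indices evenly over `subset_size` disjoint wheels;
    # a candidate solution then picks exactly one feature per wheel.
    rng = random.Random(seed)
    idx = list(range(n_feats))
    rng.shuffle(idx)
    return [idx[i::subset_size] for i in range(subset_size)]

def decode(vector, wheels):
    # Map each real-valued DE gene in [0, 1) to a position on its wheel.
    return [w[int(g * len(w)) % len(w)] for g, w in zip(vector, wheels)]
```

    The second variant described in the abstract, where only an upper limit on the subset size is given, could additionally let a gene map to "no feature" on its wheel.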

    A combined ant colony and differential evolution feature selection algorithm

    Feature selection is an important step in many pattern recognition systems that aims to overcome the so-called curse of dimensionality. Although Ant Colony Optimization (ACO) has proved to be a powerful technique in different optimization problems, it still needs some improvements when applied to feature selection. This is due to the fact that it builds its solutions sequentially, a behavior that, in feature selection, will most likely not lead to the optimal solution. In this paper, a novel feature selection algorithm based on a combination of ACO and a simple, yet powerful, Differential Evolution (DE) operator is presented. The proposed combination enhances both the exploration and exploitation capabilities of the search procedure. The new algorithm is tested on two biosignal-driven applications, and its performance is compared with other dimensionality reduction techniques to demonstrate its superiority. © 2008 Springer-Verlag Berlin Heidelberg
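
    The sequential construction that the abstract criticizes can be sketched as follows (an illustrative sketch with hypothetical names; the DE recombination operator that the paper adds on top is omitted here):

```python
import random

def ant_subset(pheromone, size, rng):
    # One ant builds a feature subset sequentially: each step draws a
    # feature with probability proportional to its pheromone intensity
    # (roulette-wheel selection), without replacement.
    chosen, avail = [], list(range(len(pheromone)))
    for _ in range(size):
        r, acc = rng.random() * sum(pheromone[f] for f in avail), 0.0
        for f in avail:
            acc += pheromone[f]
            if acc >= r:
                break
        chosen.append(f)
        avail.remove(f)
    return chosen
```

    Because each feature is chosen one at a time, features that are only useful in combination are easily missed, which motivates recombining whole ant solutions with a DE-style operator.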